“I don’t even know if we have had our ‘Silent Spring’ yet”
May 6, 2018 11:26 AM

"Increasingly, though, individuals need to worry about another kind of privacy violation. I think of it as a modern tweak on the tragedy of the commons — call it “privacy of the commons.” It’s what happens when one person’s voluntary disclosure of personal information exposes the personal information of others who had no say in the matter. Your choices didn’t cause the breach. Your choices can’t prevent it, either. Welcome to a world where you can’t opt out of sharing, even if you didn’t opt in." [SL538]
There isn’t yet a clear answer for what the U.S. should do. Almost all of our privacy law and policy is framed around the idea of privacy as a personal choice, Cohen said. The result: very little regulation addressing what data can be collected, how it should be protected, or what can be done with it. In some ways, Gillmor said, online privacy is where the environmental movement was back in the 1950s, when lots of big, centralized choices were hurting individuals’ health, and individuals had little power to change that. “I don’t even know if we have had our ‘Silent Spring’ yet,” he said. “Maybe Cambridge Analytica will be our ‘Silent Spring.’”
posted by not_the_water (6 comments total) 29 users marked this as a favorite
 
Part of the problem is that the very groups that should have been fighting for us have long since been captured by the very organizations at the heart of the problem.
posted by NoxAeternum at 12:19 PM on May 6, 2018 [6 favorites]


I doubt Cambridge Analytica is the answer. In fact, I'd hazard that, like so many things that call for "solutions," the cure may just be worse than the disease.

Especially when politicians can barely handle the concept of a social graph as an entity (vs the individual person).

Does "my rights extend to where your social graph ends"?

How deep do these connections need to be severed.
posted by symbioid at 12:20 PM on May 6, 2018 [2 favorites]


Instead, experts say these examples show that we need to think about online privacy less as a personal issue and more as a systemic one. Our digital commons is set up to encourage companies and governments to violate your privacy. If you live in a swamp and an alligator attacks you, do you blame yourself for being a slow swimmer? Or do you blame the swamp for forcing you to hang out with alligators?

This... is a really bad analogy.
posted by Grandysaur at 12:30 PM on May 6, 2018 [14 favorites]


There isn’t yet a clear answer for what the U.S. should do. Almost all of our privacy law and policy is framed around the idea of privacy as a personal choice, Cohen said.

Sorta seems like Europe, via GDPR, is going to pull the U.S. into its wake on this question, but it won't be enough to address the more profound long-term threat. I manage IT networking and IT services teams for a US-based B2B software company that has European employees and does business in Europe. GDPR has required us to more thoroughly document the user information we gather and where it gets used, and I've been surprised at what constitutes "user tracking." Before GDPR, I would have said we don't really "track" users. Personally, as much as GDPR is making my professional life a little harder, I'm grateful for it. It's an extreme statement from the EU that is forcing companies to reset on some things, and to the extent there's a lot of ambiguity and companies don't want to mess with the EU, it's forcing US companies to implement stuff they wouldn't if it weren't so aggressive. Even so, it still won't be enough for me.

The real menace of social networking is not the risk to the secrecy of our personal information, but rather the way in which the interactions you have over those services contribute to models that are used to manipulate people. Sometimes it's for dumb stuff, like trying to get me to pull the trigger on some premium slippers. Other times it's for important stuff, like getting me to believe a lie in order to manipulate my vote. This stuff is operating at such scale that my individual decision to have my personal data expunged from the system (and hence the model) probably doesn't really affect it, and I'd guess my demographic profile will live on, detached from my PII, contributing to the model anyhow. Some folks are minimizing the efficacy of those influence models. I don't know what to say to that, but it reads to me as hubris.

The way I see the conversation shaping up, there will be no significant resistance to this aspect of social networking. My college friends (GenXers) have largely surrendered to Facebook because it was a better option than the decade+ they spent using their corporate email addresses (that always made me cringe, but I was one of the small set of "techies" in our peer group who had the wherewithal to build my own services/hold my own domain). A few of them took me up on offers to either join an email server or web space I'd set up and committed to running (going on 20 years now), but most didn't.

Younger friends of mine simply don't consider these services negotiable, and their rhetoric is hardening in favor of "sucks being held hostage, but to drop these services is to be less than a fully participating human in my social scene." I've taken the hint and simply don't talk about it much anymore because they've signaled they don't care to discuss it any further. With the recent emotional labor/privilege framing Sarah Jeong put forth, I think progressives are talking themselves out of any significant challenge to what these services do at that level. Jeong's essay doesn't pay any attention to that wider manipulation/influence mission, and instead focuses on information secrecy and surveillance. I think, though, that this is a case of the future being way more Huxley than Orwell. As much as I appreciate her thoughtfulness (and respect the idea that my ability to build my own Internet presence is a profound kind of privilege), I'm sorry she didn't talk about that. (Maybe she has elsewhere; corrections welcome.)

But I don't think GDPR is going to help with this piece, either. The parts that mandate notification about the data gathered are framed in terms of what PII is being collected, not the underlying influence models that data is being used to build, which no amount of data security actually addresses. The parts that address our right to be forgotten do not, as near as I can tell, go deep enough to make building those influence models untenable: once you're captured as a type/profile, you've contributed to the model. Leaving the service after the fact doesn't mean you're taking the signal you contributed along with you, and there are millions more like you anyhow.
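To make that concrete, here's a toy sketch (synthetic data, scikit-learn defaults, made-up feature dimensions; no real ad platform works exactly like this) of why one person's erasure barely moves a model trained on everyone else:

# Toy illustration: a model trained on many users barely changes
# when one user exercises their "right to be forgotten."
# Entirely synthetic data; illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_users = 100_000
X = rng.normal(size=(n_users, 10))           # stand-in demographic features
w = rng.normal(size=10)
y = (X @ w + rng.normal(size=n_users) > 0).astype(int)  # "clicked the ad"

full = LogisticRegression().fit(X, y)

# User 0 asks to be erased: retrain without their row.
minus_one = LogisticRegression().fit(X[1:], y[1:])

# Their old profile is still scored almost identically, because
# 99,999 similar people still anchor the same coefficients.
p_before = full.predict_proba(X[:1])[0, 1]
p_after = minus_one.predict_proba(X[:1])[0, 1]
print(f"scored before erasure: {p_before:.6f}")
print(f"scored after erasure:  {p_after:.6f}")

The specific numbers don't matter; the point is that any single row is statistically redundant once enough similar people have been captured, which is exactly why "delete my data" doesn't reach the model.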
posted by mph at 12:34 PM on May 6, 2018 [14 favorites]


In fact, I'd hazard that, like so many things that call for "solutions," the cure may just be worse than the disease.

This is what worries me. Usually it happens when a tipping point is reached and people in authority have to do something, even if they have, at best, a tenuous grasp of what it is we need to fix.

The real menace of social networking is not the risk to the secrecy of our personal information, but rather the way in which the interactions you have over those services contribute to models that are used to manipulate people. Sometimes it's for dumb stuff, like trying to get me to pull the trigger on some premium slippers. Other times it's for important stuff, like getting me to believe a lie in order to manipulate my vote.

This is indeed the challenge. Few people object when a vendor suggests purchases, especially when the suggestions fit. And the only time anyone objects to getting people to believe a lie is when others are duped into believing it. It seems to me these data gatherers are awfully good at channeling lies to people who are already receptive to believing them. To put it simply, people are getting pretty much what they ask for. That's why the whole thing is such a yawn for so many.
posted by 2N2222 at 6:59 PM on May 6, 2018 [1 favorite]


Can we put out the concept of fire?

There needs to be something at the level of a global First Amendment, framed to prevent harm to individuals and society; then let litigation and statute work out the details. But, like fire, data collection and analysis is not going back in the bottle: we can't unlearn it, and we certainly can't stop ourselves. Information wants to be free, everything about us is on a network somewhere, and all the networks are connected.

Fine, I can ask for my FB data to be erased, but just the act of making that request is data on the wire, recorded by multiple agencies. FB has a slightly open picture of where its data went, but what about the third and fourth levels: Google gathering stats on Cambridge Analytica, which feed marketing databases at a slew of companies? Is the aggregate you? No. Can a next-gen deep learning algorithm extract your data from a bunch of aggregated DBs? Maybe not today.
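And the aggregation plumbing is not exotic. Here's a toy linkage sketch (made-up column names and a handful of rows; the real version runs at data-broker scale) of how two "anonymized" databases re-attach a name to a profile:

# Toy linkage attack: neither dataset pairs a name with a behavior,
# but joining on quasi-identifiers does. Hypothetical schemas and
# rows; the point is only that aggregation is weak protection.
import pandas as pd

# Dataset A: a marketing DB with no names, "just" demographics + interests
ads = pd.DataFrame({
    "zip": ["97201", "97201", "10001"],
    "birth_year": [1971, 1985, 1971],
    "sex": ["M", "F", "M"],
    "interest": ["premium slippers", "gardening", "politics"],
})

# Dataset B: a voter roll or breach dump that does have names
roll = pd.DataFrame({
    "name": ["Pat Doe", "Sam Roe"],
    "zip": ["97201", "10001"],
    "birth_year": [1971, 1971],
    "sex": ["M", "M"],
})

# The join re-attaches identities to the "anonymous" profiles.
print(roll.merge(ads, on=["zip", "birth_year", "sex"]))

Neither table had a name next to a behavior; a join on three quasi-identifiers did the rest.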

The very concept of privacy is archaic. We need protection, the right balance of rights and responsibilities to keep corporations and governments from doing harm, but it's out of the bottle.
posted by sammyo at 7:29 PM on May 6, 2018




This thread has been archived and is closed to new comments